Spring 2013 Statistics 153 (Time Series): Lecture Twenty Three, Aditya Guntuboyina
Abstract
When we were fitting ARMA models to the data, we first looked at the sample autocovariance or autocorrelation function and then tried to find the ARMA model whose theoretical acf matched the sample acf. The sample autocovariance function is a nonparametric estimate of the theoretical autocovariance function of the process. In other words, we first estimated γ(h) nonparametrically by γ̂(h) and then found an ARMA model whose γ_ARMA(h) is close to γ̂(h).
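As a rough illustration of this two-step idea (my own sketch, not taken from the lecture notes), the following Python snippet simulates an AR(1) series, computes the sample acf ρ̂(h) nonparametrically, and then matches an AR(1) model to it by choosing φ so that the theoretical acf φ^h agrees with ρ̂(h) at lag 1. The simulated model, parameter values, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative AR(1) series X_t = phi * X_{t-1} + Z_t with standard normal noise.
phi_true, n = 0.7, 2000
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + z[t]

def sample_acvf(x, h):
    """Nonparametric estimate: gamma_hat(h) = (1/n) * sum (x_t - xbar)(x_{t+h} - xbar)."""
    m, xbar = len(x), x.mean()
    return np.sum((x[: m - h] - xbar) * (x[h:] - xbar)) / m

gamma_hat = np.array([sample_acvf(x, h) for h in range(6)])
rho_hat = gamma_hat / gamma_hat[0]        # sample acf rho_hat(h)

# Match an AR(1): its theoretical acf is rho(h) = phi**h, so take phi = rho_hat(1).
phi_fit = rho_hat[1]
rho_model = phi_fit ** np.arange(6)       # theoretical acf of the matched model

print("sample acf: ", np.round(rho_hat, 3))
print("AR(1) acf:  ", np.round(rho_model, 3))
```

For this simple model the matching step reduces to a method-of-moments (Yule-Walker) estimate; for general ARMA(p, q) models the same idea requires matching the sample acf at several lags.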
Similar articles
Spring 2013 Statistics 153 (Time Series): Lecture Twenty Two, Aditya Guntuboyina
Let {Xt} be a stationary sequence of random variables and let γX(h) = cov(Xt, Xt+h) denote the autocovariance function. A theorem due to Herglotz (sometimes attributed to Bochner) states that every autocovariance function γX can be written as γX(h) = ∫_{−1/2}^{1/2} e^{2πiλh} dF(λ), where F(·) is a non-negative, right-continuous, non-decreasing function on [−1/2, 1/2] with F(−1/2) = 0 and F(1/2) = γX(0)...
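As a quick numerical sanity check of this representation (a sketch of my own, not part of the lecture), one can take an AR(1) process, whose spectral distribution has density f(λ) = σ²/|1 − φe^{−2πiλ}|² on [−1/2, 1/2], and verify that integrating e^{2πiλh} f(λ) over [−1/2, 1/2] reproduces the closed-form autocovariance σ²φ^h/(1 − φ²). The AR(1) choice and parameter values are assumptions for illustration.

```python
import numpy as np

# Assumed AR(1) example: X_t = phi * X_{t-1} + Z_t, Var(Z_t) = sigma2.
phi, sigma2 = 0.6, 1.0

# Spectral density f(lam) on [-1/2, 1/2); here dF(lam) = f(lam) dlam.
lam = np.linspace(-0.5, 0.5, 4096, endpoint=False)
dlam = lam[1] - lam[0]
f = sigma2 / np.abs(1.0 - phi * np.exp(-2j * np.pi * lam)) ** 2

for h in range(5):
    gamma_exact = sigma2 * phi ** h / (1.0 - phi ** 2)             # closed form for AR(1)
    gamma_spec = ((np.exp(2j * np.pi * lam * h) * f).sum() * dlam).real  # Herglotz integral
    print(h, round(gamma_exact, 6), round(gamma_spec, 6))
```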
Spring 2014 Statistics 210b (Theoretical Statistics) - Lecture One, Aditya
1. Some aspects of classical empirical process theory: uniform laws of large numbers, process convergence, and uniform central limit theorems.
2. M-estimation: asymptotic theory of consistency, rates of convergence, and limiting distributions.
3. Non-asymptotic theory of penalized empirical risk minimization; non-asymptotic deviation inequalities for suprema of empirical processes, oracle inequalit...
Spring 2014 Statistics 210b (Theoretical Statistics) - Lecture Two, Aditya
The main idea of upper bounding (3) is to use Rademachers. A Rademacher variable σ simply takes the two values +1 and −1, each with probability 1/2. Let σ1, ..., σn be n independent Rademachers that are also independent of X1, ..., Xn and X′1, ..., X′n. For each i, note that the distribution of f(Xi) − f(X′i) is the same as the distribution of f(X′i) − f(Xi). Therefore, the distrib...
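To make the symmetrization step concrete (a small Monte Carlo sketch of my own, using an assumed finite class of indicator functions rather than anything from the lecture), the code below checks that inserting independent Rademacher signs in front of f(Xi) − f(X′i) leaves the expected supremum over the class unchanged, since each difference is symmetrically distributed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed finite function class: indicators f_t(x) = 1{x <= t} over a grid of thresholds.
thresholds = np.linspace(0.05, 0.95, 19)

def sup_abs_sum(x, xprime, signs):
    """sup over the class of | sum_i signs_i * (f(X_i) - f(X'_i)) |."""
    fx = (x[:, None] <= thresholds).astype(float)        # n x |F| matrix of f(X_i)
    fxp = (xprime[:, None] <= thresholds).astype(float)  # n x |F| matrix of f(X'_i)
    return np.abs((signs[:, None] * (fx - fxp)).sum(axis=0)).max()

n, reps = 50, 5000
no_sym, with_sym = [], []
for _ in range(reps):
    x = rng.uniform(size=n)                    # X_1, ..., X_n
    xprime = rng.uniform(size=n)               # independent copy X'_1, ..., X'_n
    sigma = rng.choice([-1.0, 1.0], size=n)    # independent Rademacher signs
    no_sym.append(sup_abs_sum(x, xprime, np.ones(n)))
    with_sym.append(sup_abs_sum(x, xprime, sigma))

# The two Monte Carlo averages agree up to simulation error, because
# sigma_i * (f(X_i) - f(X'_i)) has the same joint distribution as f(X_i) - f(X'_i).
print(np.mean(no_sym), np.mean(with_sym))
```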